A Class of Renyi Information Estimators for Multidimensional Densities

Author

  • NIKOLAI LEONENKO
Abstract

In a recent paper [1], Leonenko, Pronzato and Savani consider the estimation of the Rényi and Tsallis entropies, respectively $H^*_q = \frac{1}{1-q}\log\int_{\mathbb{R}^m} f^q(x)\,dx$ and $H_q = \frac{1}{q-1}\bigl(1 - \int_{\mathbb{R}^m} f^q(x)\,dx\bigr)$, $q \neq 1$, of an unknown probability measure with density $f$ on $\mathbb{R}^m$ with respect to the Lebesgue measure, through $k$th nearest-neighbor distances in a sample $X_1,\dots,X_N$ i.i.d. with the density $f$. The results in [1] about the asymptotic unbiasedness and consistency of the estimator proposed for $I_q = \mathsf{E}\{f^{q-1}(X)\} = \int_{\mathbb{R}^m} f^q(x)\,dx$ are correct for $q > 1$ but, for $q < 1$, convergence in distribution should be complemented by additional arguments to obtain the required convergence of moments. Following [1], define $\hat{I}_{N,k,q} = \frac{1}{N}\sum_{i=1}^N (\zeta_{N,i,k})^{1-q}$, with $\zeta_{N,i,k} = (N-1)\,C_k\,V_m\,(\rho^{(i)}_{k,N-1})^m$, where $V_m = \pi^{m/2}/\Gamma(m/2+1)$ is the volume of the unit ball $B(0,1)$ in $\mathbb{R}^m$, $C_k = \bigl[\Gamma(k)/\Gamma(k+1-q)\bigr]^{1/(1-q)}$ and $\rho^{(i)}_{k,N-1}$ denotes the $k$th nearest-neighbor distance from a given $X_i$ to some other $X_j$ in the sample $X_1,\dots,X_N$. Also define $r_c(f) = \sup\{r > 0 : \int_{\mathbb{R}^m} |x|^r f(x)\,dx < \infty\}$, so that $\mathsf{E}|X_i|^r < \infty$ if $r < r_c(f)$ and $\mathsf{E}|X_i|^r = \infty$ if $r > r_c(f)$ (see [2]). A correct version of the convergence results of $\hat{I}_{N,k,q}$ to $I_q$ for $0 < q < 1$ and $f$ having unbounded support can be obtained by using results on the subadditivity of Euclidean functionals (see [3]) and can be formulated as follows [2]: if $I_q < \infty$ and $r_c(f) > m\,\frac{1-q}{q}$, then $\mathsf{E}[\hat{I}_{N,k,q}] \to I_q$, $N \to \infty$ (asymptotic unbiasedness; see Theorem 3.1 of [1]); if $I_q < \infty$, $q > 1/2$ and $r_c(f) > 2m\,\frac{1-q}{2q-1}$, then $\mathsf{E}[\hat{I}_{N,k,q} - I_q]^2 \to 0$, $N \to \infty$ ($L_2$ convergence; see Theorem 3.2 of [1]). The situation is simpler when $f$ has bounded support $S_f$. When $f$ is bounded away from 0 and infinity on $S_f$ and $S_f$ consists of a finite union of convex bounded sets with...
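For readers who want to try the estimator numerically, the following is a minimal sketch that computes $\hat{I}_{N,k,q}$ exactly as defined above; it is not taken from [1] or [2], and the function name renyi_I_estimate as well as the use of scipy.spatial.cKDTree for the nearest-neighbor search are choices made here for illustration.

```python
# Minimal sketch of the k-nearest-neighbor estimator I_hat_{N,k,q} defined in
# the abstract (illustrative code, not from [1] or [2]).
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln


def renyi_I_estimate(X, k=5, q=0.8):
    """Estimate I_q = E{f^{q-1}(X)}, the integral of f^q, from an i.i.d. sample.

    X is an (N, m) array of observations, k the neighbor order, q != 1.
    """
    X = np.asarray(X, dtype=float)
    N, m = X.shape
    # rho^{(i)}_{k,N-1}: distance from X_i to its kth nearest neighbor among
    # the other points (column 0 of the query is X_i itself at distance 0).
    rho = cKDTree(X).query(X, k=k + 1)[0][:, k]
    # V_m = pi^{m/2} / Gamma(m/2 + 1), volume of the unit ball B(0,1) in R^m.
    log_Vm = 0.5 * m * np.log(np.pi) - gammaln(0.5 * m + 1.0)
    # C_k = [Gamma(k) / Gamma(k + 1 - q)]^{1/(1-q)}, computed in log form.
    log_Ck = (gammaln(k) - gammaln(k + 1.0 - q)) / (1.0 - q)
    # zeta_{N,i,k} = (N - 1) C_k V_m rho^m; the estimator averages zeta^{1-q}.
    log_zeta = np.log(N - 1.0) + log_Ck + log_Vm + m * np.log(rho)
    return np.mean(np.exp((1.0 - q) * log_zeta))
```

From $\hat{I}_{N,k,q}$ one obtains, as in [1], the Rényi entropy estimate $\hat{H}^*_q = \log(\hat{I}_{N,k,q})/(1-q)$ and the Tsallis estimate $\hat{H}_q = (1-\hat{I}_{N,k,q})/(q-1)$.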


Related articles

Relations between Renyi Distance and Fisher Information

In this paper, we first show that the Renyi distance between any member of a parametric family and its perturbations is proportional to its Fisher information. We then prove some relations between the Renyi distance of two distributions and the Fisher information of their exponentially twisted family of densities. Finally, we show that the partial ordering of families induced by Renyi dis...
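The proportionality claim can be read as the standard local expansion of the Rényi divergence of order $\alpha$ around a parameter value; the following is my own one-line summary of that expansion (under the usual regularity conditions on the family $f_\theta$), not a formula quoted from the paper.

```latex
% Second-order expansion of the Renyi divergence under a small perturbation
% \Delta of \theta; I(\theta) denotes the Fisher information matrix.
D_\alpha\bigl(f_\theta \,\|\, f_{\theta+\Delta}\bigr)
  = \frac{1}{\alpha-1}\log\int f_\theta^{\alpha}(x)\, f_{\theta+\Delta}^{1-\alpha}(x)\,dx
  = \frac{\alpha}{2}\,\Delta^{\top} I(\theta)\,\Delta + o(\|\Delta\|^{2}),
  \qquad \Delta \to 0 .
```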


Testing Exponentiality Based on Renyi Entropy of Transformed Data

In this paper, we introduce new tests for exponentiality based on estimators of Renyi entropy of a continuous random variable. We first consider two transformations of the observations which turn the test of exponentiality into one of uniformity and use a corresponding test based on Renyi entropy. Critical values of the test statistics are computed by Monte Carlo simulations. Then, we compare p...


Effect of Measurement Errors on a Class of Estimators of Population Mean Using Auxiliary Information in Sample Surveys

We consider the problem of estimating the population mean of the study variate Y in the presence of measurement errors when information on an auxiliary character X is known. A class of estimators for the population mean using information on an auxiliary variate X is defined. Expressions for its asymptotic bias and mean square error are obtained. Optimum conditions are obtained for which the mean...


Quasi-Concave Density Estimation

Maximum likelihood estimation of a log-concave probability density is formulated as a convex optimization problem and shown to have an equivalent dual formulation as a constrained maximum Shannon entropy problem. Closely related maximum Renyi entropy estimators that impose weaker concavity restrictions on the fitted density are also considered, notably a minimum Hellinger discrepancy estimator ...


Nonparametric Estimation of Renyi Divergence and Friends

We consider nonparametric estimation of L2, Rényi-α and Tsallis-α divergences between continuous distributions. Our approach is to construct estimators for particular integral functionals of two densities and translate them into divergence estimators. For the integral functionals, our estimators are based on corrections of a preliminary plug-in estimator. We show that these estimators achieve t...
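The "translation" step mentioned here amounts to writing each divergence as a function of an integral functional of the two densities $p$ and $q$ (here $q$ denotes the second density, not an entropy order); the relations below are the standard definitions, sketched for orientation rather than quoted from the paper.

```latex
% Divergences as explicit functions of integral functionals of p and q;
% an estimator of the functional yields a plug-in divergence estimator.
\|p-q\|_2^2 = \int p^2 - 2\int p\,q + \int q^2, \qquad
D^{\mathrm{R}}_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\log\int p^{\alpha} q^{1-\alpha}, \qquad
D^{\mathrm{T}}_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\Bigl(\int p^{\alpha} q^{1-\alpha} - 1\Bigr).
```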


M-estimators as GMM for Stable Laws Discretizations

This paper is devoted to "Some Discrete Distributions Generated by Standard Stable Densities" (in short, Discrete Stable Densities). The large-sample properties of M-estimators as obtained by the "Generalized Method of Moments" (GMM) are discussed for such distributions. Some corollaries are proposed. Moreover, using the respective results we demonstrate the large-sample pro...





Publication year: 2005